Fast Nonnegative Least Squares Through Flexible Krylov Subspaces
Authors
Abstract
Constrained least squares problems arise in a variety of applications, and many iterative methods are already available to compute their solutions. This paper proposes a new efficient approach to solving nonnegative linear least squares problems. The associated KKT conditions are leveraged to form an adaptively preconditioned linear system, which is then solved by a flexible Krylov subspace method. The new method can be easily applied to image reconstruction problems affected by both Gaussian and Poisson noise, where the components of the solution represent nonnegative intensities. Theoretical insight is given, and numerical experiments and comparisons are presented to validate the new method, which delivers results of equal or better quality than many state-of-the-art nonnegative least squares solvers, with a significant speedup.
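To make the ingredients concrete, the following is a minimal NumPy sketch of a flexible GMRES iteration applied to the normal equations A^T A x = A^T b, with an iteration-dependent diagonal preconditioner diag(max(x_k, eps)) suggested by the KKT complementarity condition x ∘ (A^T(Ax − b)) = 0. This is an illustrative reading of the abstract, not the paper's exact algorithm; the function name fgmres_nnls, the positive starting guess, and the eps safeguard are assumptions.

```python
import numpy as np

def fgmres_nnls(A, b, maxit=50, tol=1e-10, eps=1e-12):
    """Flexible GMRES on the normal equations A^T A x = A^T b with an
    iteration-dependent diagonal preconditioner built from the current
    iterate. Illustrative sketch: the adaptive scaling promotes, but
    does not strictly enforce, nonnegativity of the iterates."""
    normal_op = lambda v: A.T @ (A @ v)
    rhs = A.T @ b
    n = rhs.size

    x0 = np.maximum(rhs, eps)             # positive starting guess (assumption)
    x = x0.copy()
    r = rhs - normal_op(x0)
    beta = np.linalg.norm(r)
    V = np.zeros((n, maxit + 1))          # orthonormal Krylov basis
    Z = np.zeros((n, maxit))              # preconditioned search directions
    H = np.zeros((maxit + 1, maxit))      # flexible Arnoldi Hessenberg matrix
    V[:, 0] = r / beta

    for j in range(maxit):
        # Adaptive preconditioner D_j = diag(max(x_j, eps)), motivated by
        # the KKT complementarity x .* (A^T (A x - b)) = 0.
        Z[:, j] = np.maximum(x, eps) * V[:, j]
        w = normal_op(Z[:, j])
        for i in range(j + 1):            # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w = w - H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)

        # Projected least squares problem: min_y ||beta*e1 - H_j y||.
        e1 = np.zeros(j + 2)
        e1[0] = beta
        y, *_ = np.linalg.lstsq(H[:j + 2, :j + 1], e1, rcond=None)
        x = x0 + Z[:, :j + 1] @ y

        res = np.linalg.norm(e1 - H[:j + 2, :j + 1] @ y)
        if res <= tol * beta or H[j + 1, j] == 0:
            break
        V[:, j + 1] = w / H[j + 1, j]

    return np.maximum(x, 0.0)             # final projection onto x >= 0
```

In this sketch the preconditioner only encourages nonnegativity as the iteration proceeds, so the last line applies a hard projection onto the nonnegative orthant.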
Similar resources
A Subspace Approximation Method for the Quadratic Eigenvalue Problem
Quadratic eigenvalue problems involving large matrices arise frequently in areas such as the vibration analysis of structures, MEMS simulation, and the solution of quadratically constrained least squares problems. The typical approach is to solve the quadratic eigenvalue problem using a mathematically equivalent linearized formulation, resulting in a doubled dimension and a lack of backward sta...
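As an illustration of the linearization this snippet refers to, the sketch below builds the standard companion linearization of a quadratic eigenvalue problem (λ²M + λC + K)x = 0 as a 2n×2n generalized eigenproblem, which is where the doubled dimension comes from; the random test matrices are assumptions for demonstration only.

```python
import numpy as np
from scipy.linalg import eig

# Companion linearization of (lam^2 M + lam C + K) x = 0 with z = [x; lam*x]:
# solve the 2n-by-2n generalized eigenproblem A z = lam B z, where
# A = [[0, I], [-K, -C]] and B = [[I, 0], [0, M]].
rng = np.random.default_rng(1)
n = 4
M = np.eye(n)
C = rng.standard_normal((n, n))
K = rng.standard_normal((n, n))

I = np.eye(n)
Z = np.zeros((n, n))
A = np.block([[Z, I], [-K, -C]])
B = np.block([[I, Z], [Z, M]])
lam, z = eig(A, B)

# Verify: the leading n components of each eigenvector should satisfy
# (lam^2 M + lam C + K) x ~ 0 up to rounding.
l0, x = lam[0], z[:n, 0]
print(np.linalg.norm((l0**2 * M + l0 * C + K) @ x))
```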
A Generalized Krylov Subspace Method for ℓp-ℓq Minimization
This paper presents a new efficient approach for the solution of the ℓp-ℓq minimization problem based on the application of successive orthogonal projections onto generalized Krylov subspaces of increasing dimension. The subspaces are generated according to the iteratively reweighted least-squares strategy for the approximation of ℓp/ℓq-norms by weighted ℓ2-norms. Computed image restoration exa...
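A toy illustration of the reweighting idea mentioned here (not the paper's generalized-Krylov algorithm): iteratively reweighted least squares replaces the ℓp data-fit term by a sequence of weighted ℓ2 problems with weights |r_i|^(p−2), smoothed by a small eps. The function name and parameters are hypothetical.

```python
import numpy as np

def irls_lp(A, b, p=1.0, iters=30, eps=1e-8):
    """Toy IRLS for min_x ||Ax - b||_p^p: each sweep solves a weighted
    least squares problem with smoothed weights w_i = (r_i^2 + eps)^((p-2)/2)."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]   # start from the l2 solution
    for _ in range(iters):
        r = A @ x - b
        w = (r**2 + eps) ** ((p - 2) / 2)      # smoothed |r_i|^(p-2) weights
        sw = np.sqrt(w)
        # weighted subproblem: min_x || diag(sqrt(w)) (A x - b) ||_2
        x = np.linalg.lstsq(A * sw[:, None], b * sw, rcond=None)[0]
    return x
```

The paper's approach, as described in the snippet, avoids re-solving each weighted problem from scratch by projecting onto generalized Krylov subspaces that grow across reweighting sweeps; the sketch above shows only the reweighting itself.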
Residual and Backward Error Bounds in Minimum Residual Krylov Subspace Methods
Minimum residual norm iterative methods for solving linear systems Ax = b can be viewed as, and are often implemented as, sequences of least squares problems involving Krylov subspaces of increasing dimensions. The minimum residual method (MINRES) [C. Paige and M. Saunders, SIAM J. Numer. Anal., 12 (1975), pp. 617–629] and generalized minimum residual method (GMRES) [Y. Saad and M. Schultz, SIA...
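For context, the normwise backward error η(x) = ‖b − Ax‖ / (‖A‖‖x‖ + ‖b‖) is the standard quantity that such residual-based bounds control. The sketch below computes it for a GMRES solve; the random diagonally shifted test matrix is an assumption for demonstration.

```python
import numpy as np
from scipy.sparse.linalg import gmres

# Normwise backward error eta(x) = ||b - A x|| / (||A||*||x|| + ||b||),
# the quantity residual-based stopping criteria aim to control.
rng = np.random.default_rng(0)
n = 200
A = rng.standard_normal((n, n)) / np.sqrt(n) + 2 * np.eye(n)  # well-conditioned test matrix
b = rng.standard_normal(n)

x, info = gmres(A, b)                     # minimum residual Krylov solve
eta = np.linalg.norm(b - A @ x) / (
    np.linalg.norm(A, 2) * np.linalg.norm(x) + np.linalg.norm(b)
)
print(f"gmres info={info}, backward error eta={eta:.2e}")
```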
On Iterative Krylov-Dogleg Trust-Region Steps for Solving Neural Networks Nonlinear Least Squares Problems
This paper describes a method of dogleg trust-region steps, or restricted Levenberg-Marquardt steps, based on a projection process onto the Krylov subspaces for neural networks nonlinear least squares problems. In particular, the linear conjugate gradient (CG) method works as the inner iterative algorithm for solving the linearized Gauss-Newton normal equation, whereas the outer nonlinear algor...
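A minimal sketch of the kind of inner step described here, assuming a dense Jacobian J: truncated (Steihaug) CG on the Gauss-Newton model, which approximately solves the linearized normal equations and stops at the trust-region boundary. The function name and interface are hypothetical.

```python
import numpy as np

def steihaug_cg(J, r, delta, maxit=100, tol=1e-10):
    """Truncated (Steihaug) CG step for the Gauss-Newton trust-region model
    min_p 0.5*||J p + r||^2 subject to ||p|| <= delta, i.e. it approximately
    solves J^T J p = -J^T r, truncating at the trust-region boundary."""

    def to_boundary(p, d):
        # largest tau > 0 with ||p + tau*d|| = delta
        pd, dd, pp = p @ d, d @ d, p @ p
        return p + ((-pd + np.sqrt(pd**2 + dd * (delta**2 - pp))) / dd) * d

    p = np.zeros(J.shape[1])
    g = J.T @ r                     # model gradient at p = 0
    d = -g
    gg = g @ g
    for _ in range(maxit):
        Hd = J.T @ (J @ d)          # Gauss-Newton Hessian-vector product
        dHd = d @ Hd
        if dHd <= 0:                # flat direction: move to the boundary
            return to_boundary(p, d)
        alpha = gg / dHd
        if np.linalg.norm(p + alpha * d) >= delta:
            return to_boundary(p, d)   # full CG step would leave the region
        p = p + alpha * d
        g = g + alpha * Hd
        gg_new = g @ g
        if np.sqrt(gg_new) < tol:
            break
        d = -g + (gg_new / gg) * d
        gg = gg_new
    return p
```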
Iterative Scaled Trust-Region Learning in Krylov Subspaces via Pearlmutter's Implicit Sparse Hessian-Vector Multiply
The online incremental gradient (or backpropagation) algorithm is widely considered to be the fastest method for solving large-scale neural-network (NN) learning problems. In contrast, we show that an appropriately implemented iterative batch-mode (or block-mode) learning method can be much faster. For example, it is three times faster in the UCI letter classification problem (26 outputs, 16,00...
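Pearlmutter's implicit Hessian-vector multiply, referenced in the title, computes Hv exactly as a directional derivative of the gradient, without ever forming H. A minimal sketch using JAX's jvp follows; the library choice and the quadratic test function are assumptions.

```python
import jax
import jax.numpy as jnp

def hvp(f, w, v):
    """Exact Hessian-vector product via Pearlmutter's R-operator:
    Hv = d/deps [grad f(w + eps*v)] at eps = 0, i.e. a JVP of the gradient."""
    return jax.jvp(jax.grad(f), (w,), (v,))[1]

# Tiny usage example on a quadratic, where H = 2*diag([1, 2, 3]).
f = lambda w: jnp.sum(jnp.array([1.0, 2.0, 3.0]) * w**2)
w = jnp.ones(3)
v = jnp.array([1.0, 0.0, 1.0])
print(hvp(f, w, v))   # expected: [2., 0., 6.]
```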
Journal: SIAM J. Scientific Computing
Volume: 39, Issue: -
Pages: -
Publication date: 2017